Approximation errors in truncated dimensional decompositions

Author

  • Sharif Rahman
Abstract

The main theme of this paper is error analysis for approximations derived from two variants of dimensional decomposition of a multivariate function: the referential dimensional decomposition (RDD) and the analysis-of-variance dimensional decomposition (ADD). New formulae are presented for the lower and upper bounds of the expected errors committed by bivariately and arbitrarily truncated RDD approximations when the reference point is selected randomly, thereby providing a means of weighing RDD against ADD approximations. The formulae reveal that the expected error from the S-variate RDD approximation of a function of N variables, where 0 ≤ S < N < ∞, is at least 2^(S+1) times greater than the error from the S-variate ADD approximation. Consequently, ADD approximations are substantially more precise than RDD approximations. The analysis also finds the RDD approximation to be sub-optimal for an arbitrarily selected reference point, whereas the ADD approximation always yields the minimum error. Therefore, the RDD approximation should be used with caution.
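To make the size of this gap concrete, the Python sketch below compares the two univariate (S = 1) truncations numerically. It is an illustration under stated assumptions, not the paper's own computation: the product-type test function, the number of variables N = 4, the Monte Carlo sample size, and the closed-form ANOVA components used for the ADD approximation are all choices made for this example. The RDD error is averaged over randomly drawn reference points, matching the setting of the quoted bound, so for this test function the printed ratio should land slightly above 2^(S+1) = 4.

import numpy as np

rng = np.random.default_rng(0)
N = 4                                   # number of independent U(0,1) inputs
M = 200_000                             # Monte Carlo sample size

def f(x):
    # Product-type test function; every subset of variables interacts.
    return np.prod(1.0 + x, axis=-1)

x = rng.random((M, N))                  # evaluation points
c = rng.random((M, N))                  # randomly selected reference points

# Univariate (S = 1) ADD approximation.  For this product-type f the ANOVA
# components are available in closed form:
#   f_0 = 1.5**N,  f_i(x_i) = (1 + x_i) * 1.5**(N - 1) - f_0.
f0 = 1.5 ** N
add1 = f0 + np.sum((1.0 + x) * 1.5 ** (N - 1) - f0, axis=1)

# Univariate RDD (anchored, cut-type) approximation at reference point c:
#   f(c) + sum_i [ f(c with its i-th coordinate replaced by x_i) - f(c) ].
fc = f(c)
cut = (1.0 + x) * fc[:, None] / (1.0 + c)   # i-th coordinate of c swapped for x_i
rdd1 = fc + np.sum(cut - fc[:, None], axis=1)

# Mean-squared truncation errors; the RDD error is averaged over the random
# reference points.
fx = f(x)
err_add = np.mean((fx - add1) ** 2)
err_rdd = np.mean((fx - rdd1) ** 2)
print(f"mean-squared error, ADD (S=1): {err_add:.4f}")
print(f"mean-squared error, RDD (S=1): {err_rdd:.4f}")
print(f"ratio RDD/ADD: {err_rdd / err_add:.2f}   (bound from the paper: 2^(S+1) = 4)")

For S = 0 the comparison reduces to the familiar factor of 2: with a randomly selected reference point R, the mean-squared error of the constant approximation f(R) is twice the variance of f, whereas the zero-variate ADD approximation (the mean of f) commits an error equal to the variance itself.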


Related articles

Additive models in high dimensions

We discuss some aspects of approximating functions on high-dimensional data sets with additive functions or ANOVA decompositions, that is, sums of functions depending on fewer variables each. It is seen that under appropriate smoothness conditions, the errors of the ANOVA decompositions are of order O(n) for independent predictor variables and approximations using sums of functions of up to m var...
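As a minimal sketch of what fitting such an additive approximation to data can look like, the following example uses a synthetic data set, simple piecewise-constant bin smoothers, and plain backfitting; none of these choices are taken from the cited paper.

import numpy as np

rng = np.random.default_rng(1)
n, d = 2000, 3
X = rng.random((n, d))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

def bin_smooth(x, r, bins=20):
    # Piecewise-constant smoother: average the residual r within bins of x.
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
    means = np.array([r[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(bins)])
    return means[idx]

g = np.zeros((d, n))                 # fitted additive components g_i(x_i)
y_mean = y.mean()
for _ in range(20):                  # backfitting sweeps
    for j in range(d):
        partial = y - y_mean - g[np.arange(d) != j].sum(axis=0)
        g[j] = bin_smooth(X[:, j], partial)
        g[j] -= g[j].mean()          # center each component for identifiability
fit = y_mean + g.sum(axis=0)
print("training RMSE of the additive fit:", np.sqrt(np.mean((y - fit) ** 2)))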


Concept decompositions for large sparse text data using clustering

This report has been submitted for publication outside of IBM and will probably be copyrighted if accepted for publication. It has been issued as a Research Report for early dissemination of its contents. In view of the transfer of copyright to the outside publisher, its distribution outside of IBM prior to publication should be limited to peer communications and specific requests. After outside...


Infinite-dimensional versions of the primary, cyclic and Jordan decompositions

The famous primary and cyclic decomposition theorems along with the tightly related rational and Jordan canonical forms are extended to linear spaces of infinite dimensions with counterexamples showing the scope of extensions.


Principal Component Analysis for Distributed Data Sets with Updating

Identifying the patterns of large data sets is a key requirement in data mining. A powerful technique for this purpose is the principal component analysis (PCA). PCA-based clustering algorithms are effective when the data sets are found in the same location. In applications where the large data sets are physically far apart, moving huge amounts of data to a single location can become an impract...


Addendum to: "Infinite-dimensional versions of the primary, cyclic and Jordan decompositions", by M. Radjabalipour

In his paper mentioned in the title, which appears in the same issue of this journal, Mehdi Radjabalipour derives the cyclic decomposition of an algebraic linear transformation. A more general structure theory for linear transformations appears in Irving Kaplansky's lovely 1954 book on infinite abelian groups. We present a translation of Kaplansky's results for abelian groups into the terminolo...



Journal:
  • Math. Comput.

Volume: 83

Pages: -

Publication date: 2014